Cubic regularization of Newton method and its global performance
Authors
Abstract
In this paper, we provide a theoretical analysis of a cubic regularization of Newton's method as applied to the unconstrained minimization problem. For this scheme, we prove general local convergence results. The main contribution of the paper, however, concerns global worst-case complexity bounds for different problem classes, including some nonconvex cases. It is shown that the search direction can be computed by standard linear algebra techniques.
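To make the last claim concrete, here is a minimal numerical sketch (not the paper's implementation; the function name, the model weight M, and the tolerances are illustrative assumptions) of how one cubic-regularized Newton direction can be obtained with standard linear algebra: an eigendecomposition of the Hessian plus a one-dimensional root find on the resulting secular equation. The degenerate "hard case" is deliberately ignored.

```python
import numpy as np

def cubic_newton_step(g, H, M, tol=1e-10):
    """Sketch of one cubic-regularized Newton direction.

    Minimizes the model  m(d) = g^T d + 0.5 d^T H d + (M/6) ||d||^3.
    At a global minimizer, (H + lam*I) d = -g with lam = (M/2)*||d||
    and H + lam*I positive semidefinite, so the computation reduces to
    an eigendecomposition and a scalar root find (hard case ignored).
    """
    mu, Q = np.linalg.eigh(H)            # H = Q diag(mu) Q^T
    gt = Q.T @ g                         # gradient in the eigenbasis

    def step_norm(lam):
        return np.linalg.norm(gt / (mu + lam))

    # Feasible shifts satisfy lam >= max(0, -mu_min); we seek the root of
    # lam - (M/2) * ||d(lam)||, which is increasing in lam.
    lo = max(0.0, -mu.min()) + 1e-12
    hi = max(lo, 1.0)
    while hi - 0.5 * M * step_norm(hi) < 0.0:    # bracket the root
        hi *= 2.0
    while hi - lo > tol * max(1.0, hi):          # bisection on the secular equation
        mid = 0.5 * (lo + hi)
        if mid - 0.5 * M * step_norm(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return -Q @ (gt / (mu + lam))

# Tiny illustrative call on an indefinite Hessian (values are arbitrary).
if __name__ == "__main__":
    H = np.array([[2.0, 0.0], [0.0, -1.0]])
    g = np.array([1.0, 1.0])
    print(cubic_newton_step(g, H, M=10.0))
```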
Similar resources
Cubic regularization of Newton’s method for convex problems with constraints
In this paper we derive efficiency estimates of the regularized Newton's method as applied to constrained convex minimization problems and to variational inequalities. We study a one-step Newton's method and its multistep accelerated version, which converges on smooth convex problems as O(1/k^3), where k is the iteration counter. We also derive the efficiency estimate of a second-order scheme ...
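For reference, the rates mentioned here can be written out explicitly; the O(1/k^2) bound for the one-step scheme is the standard companion result for cubic Newton schemes on convex problems with Lipschitz-continuous Hessian, and is stated here as an assumption rather than a quote from this abstract.

```latex
% Worst-case rates referenced above; f^* denotes the optimal value.
\begin{align*}
  \text{one-step cubic Newton scheme:}  \quad & f(x_k) - f^* \le O\!\left(k^{-2}\right),\\
  \text{accelerated multistep version:} \quad & f(x_k) - f^* \le O\!\left(k^{-3}\right).
\end{align*}
```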
LANCS Workshop on Modelling and Solving Complex Optimisation Problems
Towards optimal Newton-type methods for nonconvex smooth optimization Coralia Cartis Coralia.Cartis (at) ed.ac.uk School of Mathematics, Edinburgh University We show that the steepest-descent and Newton methods for unconstrained non-convex optimization, under standard assumptions, may both require a number of iterations and function evaluations arbitrarily close to the steepest-descent’s global...
Erratum to: A regularized Newton method without line search for unconstrained optimization
For unconstrained optimization, Newton-type methods have good convergence properties and are used in practice. Newton's method combined with a trust-region method (the TR-Newton method), the cubic regularization of Newton's method, and the regularized Newton method with a line search are such Newton-type methods. The TR-Newton method and the cubic regularization of N...
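As a small illustration of the "regularized Newton" family listed above, here is a generic sketch (assumed notation and parameters, not the erratum's actual algorithm): the direction solves a shifted Newton system, with the shift typically tied to the gradient norm when no line search is used.

```python
import numpy as np

def regularized_newton_direction(g, H, c=1.0, delta=1.0):
    """Generic regularized Newton direction (illustrative sketch):
    solve (H + mu*I) d = -g with mu = c * ||g||**delta, enlarging mu
    until H + mu*I is positive definite. The parameters c and delta
    are assumptions, not values from the cited paper."""
    mu = c * np.linalg.norm(g) ** delta
    n = H.shape[0]
    while True:
        try:
            # Cholesky succeeds exactly when H + mu*I is positive definite.
            L = np.linalg.cholesky(H + mu * np.eye(n))
            break
        except np.linalg.LinAlgError:
            mu = max(2.0 * mu, 1e-8)
    # Solve the shifted Newton system using the Cholesky factor.
    return np.linalg.solve(L.T, np.linalg.solve(L, -g))
```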
Stochastic Variance-Reduced Cubic Regularized Newton Method
We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an (ε, √ε)-approximate local minimum within Õ(n/ ) second-order oracl...
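For intuition, here is a minimal sketch of the kind of SVRG-style semi-stochastic gradient alluded to above (the interface, the batching, and the omission of the matching semi-stochastic Hessian are simplifications and assumptions, not the paper's exact construction):

```python
import numpy as np

def semi_stochastic_gradient(grad_i, x, x_snap, full_grad_snap, batch):
    """Variance-reduced (SVRG-style) gradient estimate, illustrative only:
        v = mean_{i in batch}[grad_i(i, x) - grad_i(i, x_snap)] + full_grad_snap,
    where x_snap is a periodically refreshed snapshot point and
    full_grad_snap is the full gradient evaluated at x_snap."""
    corrections = [grad_i(i, x) - grad_i(i, x_snap) for i in batch]
    return np.mean(corrections, axis=0) + full_grad_snap
```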
Interior-point Methods for Nonconvex Nonlinear Programming: Primal-dual Methods and Cubic Regularization
In this paper, we present a primal-dual interior-point method for solving nonlinear programming problems. It employs a Levenberg-Marquardt (LM) perturbation to the Karush-Kuhn-Tucker (KKT) matrix to handle indefinite Hessians and a line search to obtain sufficient descent at each iteration. We show that the LM perturbation is equivalent to replacing the Newton step by a cubic regularization ste...
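The claimed equivalence can be stated compactly in unconstrained form (a hedged sketch, with σ > 0 the cubic weight, playing the role of M/2 in the earlier sketch, rather than the paper's primal-dual notation): an LM-shifted Newton system whose shift is proportional to the step length is exactly the stationarity condition of a cubic-regularized model.

```latex
% LM-perturbed Newton system versus cubic-regularized model (sketch).
(H + \lambda I)\, d = -g, \qquad \lambda = \sigma \,\|d\|
\quad\Longleftrightarrow\quad
\nabla_d \left( g^{\top} d + \tfrac{1}{2}\, d^{\top} H d + \tfrac{\sigma}{3}\,\|d\|^{3} \right) = 0 .
```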
Journal: Math. Program.
Volume: 108
Issue: -
Pages: -
Publication date: 2006